# Multi-Checkpoint Tracking

All six models below are published by EleutherAI under the Apache-2.0 license and tagged *Large Language Model · Transformers · English*.

| Model | Description | Downloads | Likes |
| --- | --- | --- | --- |
| Pythia 1b | The 1-billion-parameter member of the Pythia suite, trained on The Pile and designed for interpretability research. | 79.69k | 38 |
| Pythia 12b | The largest model in the Pythia suite, with 12 billion parameters, built to advance scientific research on large language models. | 9,938 | 136 |
| Pythia 6.9b | A 6.9-billion-parameter member of the Pythia suite, designed to facilitate interpretability research. | 46.72k | 54 |
| Pythia 1b Deduped | A 1-billion-parameter Transformer trained on the deduplicated Pile, built for interpretability research. | 19.89k | 19 |
| Pythia 410m | The 410-million-parameter member of the Pythia series of causal language models; the suite spans 8 sizes from 70 million to 12 billion parameters, each released with 154 training checkpoints (see the tracking sketch below the table). | 83.28k | 25 |
| Pythia 1.4b | A 1.4-billion-parameter causal language model in the Pythia suite, designed for interpretability research. | 60.98k | 23 |
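
Each Pythia model publishes its 154 training checkpoints as git revisions named `step<N>` on the Hugging Face Hub, which is what makes multi-checkpoint tracking possible. The sketch below loads a few revisions of Pythia-410M and reports the language-modeling loss on a fixed prompt; the particular steps and prompt are illustrative choices, not part of the suite itself.

```python
# A minimal sketch of multi-checkpoint tracking with Pythia, assuming the
# revision names ("step1000", "step10000", ...) documented in the Pythia
# model cards. The prompt and the three steps chosen here are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "EleutherAI/pythia-410m"
REVISIONS = ["step1000", "step10000", "step143000"]  # early, mid, final

tokenizer = AutoTokenizer.from_pretrained(MODEL)
inputs = tokenizer("The capital of France is", return_tensors="pt")

for rev in REVISIONS:
    model = AutoModelForCausalLM.from_pretrained(MODEL, revision=rev)
    model.eval()
    with torch.no_grad():
        # Language-modeling loss of this checkpoint on a fixed prompt;
        # watching it fall across revisions is the simplest form of
        # checkpoint-over-checkpoint tracking.
        loss = model(**inputs, labels=inputs["input_ids"]).loss
    print(f"{rev}: loss = {loss.item():.3f}")
```

Early revisions are log-spaced (step0, step1, step2, step4, ..., step512) and later ones arrive every 1,000 steps up to step143000, so the same loop extends to sweeping all 154 checkpoints.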